The Problem of Non-Differentiability
The standard constrained optimization problem is defined as:
$$\begin{aligned}
\text{minimize} \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \leq 0, \quad i = 1, \ldots, m \\
& Ax = b
\end{aligned}$$
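To pin the notation down, here is a minimal sketch of one such instance; the quadratic objective, the single inequality, and the data $A$, $b$ are hypothetical choices for illustration:

```python
import numpy as np

# Hypothetical toy instance of the standard form above:
#   minimize    f0(x) = ||x||^2
#   subject to  f1(x) = 1 - x[0] <= 0      (i.e. x[0] >= 1)
#               A x = b  with A = [1, 1], b = [3]

def f0(x):
    """Objective: squared Euclidean norm."""
    return float(x @ x)

def f1(x):
    """Inequality constraint, written so that f1(x) <= 0 means feasible."""
    return 1.0 - x[0]

A = np.array([[1.0, 1.0]])
b = np.array([3.0])

x = np.array([2.0, 1.0])          # a strictly feasible point
print(f0(x), f1(x), A @ x - b)    # 5.0, -1.0, [0.0]
```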
We could, in principle, rewrite this using the indicator function $I_-(u)$ to absorb the inequality constraints into the objective (the equality constraint $Ax = b$ stays put). However, $I_-(u)$ is a monster for calculus:
$$I_-(u) = \begin{cases} 0 & u \leq 0 \\ \infty & u > 0 \end{cases}$$
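With this notation, absorbing the inequality constraints yields the equivalent problem:

$$\begin{aligned}
\text{minimize} \quad & f_0(x) + \sum_{i=1}^{m} I_-(f_i(x)) \\
\text{subject to} \quad & Ax = b
\end{aligned}$$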
The trouble is that $I_-$ jumps from $0$ straight to $\infty$ at $u = 0$: it is discontinuous there, and wherever it is finite its gradient is identically zero, so Newton's method gets neither a usable gradient nor the Hessian it requires. We need a differentiable surrogate.
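To make the failure concrete, a tiny numerical sketch (the `indicator` helper is hypothetical) shows that finite differences recover either zero or infinity, never usable curvature:

```python
import math

def indicator(u):
    """I_-(u): 0 if u <= 0, +infinity otherwise."""
    return 0.0 if u <= 0 else math.inf

u, h = -0.5, 1e-6
# Central finite difference at a strictly feasible point: identically zero,
# so the "gradient" carries no information about the nearby boundary.
print((indicator(u + h) - indicator(u - h)) / (2 * h))      # 0.0

# The same stencil straddling the boundary u = 0 is infinite: no usable Hessian.
print((indicator(0.0 + h) - indicator(0.0 - h)) / (2 * h))  # inf
```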
Logarithmic Smoothing
We approximate $I_-(u)$ using the function:
$$\hat{I}_-(u) = -(1/t) \log(-u), \quad \text{dom } \hat{I}_- = -\mathbf{R}_{++}$$
Here, $t > 0$ is a parameter that sets the accuracy of the approximation: the larger $t$ becomes, the more closely the barrier tracks the true indicator function, while staying convex and differentiable on its domain.
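A quick sketch (values chosen for illustration) shows the effect of $t$ at a fixed strictly feasible point, where the true indicator is $0$:

```python
import math

def barrier(u, t):
    """Log-barrier approximation -(1/t) * log(-u); defined only for u < 0."""
    return -math.log(-u) / t

u = -0.1  # a strictly feasible point, where the true indicator I_-(u) = 0
for t in (1, 10, 100, 1000):
    print(t, barrier(u, t))
# 1     2.302585...
# 10    0.230258...
# 100   0.023025...
# 1000  0.002302...   -> approaches the indicator's value of 0 as t grows
```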
Unlike active-set methods, this approach requires that every iterate $x$ remain strictly feasible ($f_i(x) < 0$ for all $i$). Because $\hat{I}_-(u)$ is only defined for $u < 0$ and rises to $+\infty$ as $u \to 0^-$, it creates an "impenetrable" barrier that keeps the search inside the interior of the feasible set.
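Here is a minimal end-to-end sketch, assuming a toy 1-D instance: minimize $x$ subject to $1 - x \le 0$, centered via $\phi(x) = t\,x - \log(x - 1)$, whose exact minimizer is $x^\star = 1 + 1/t$. The instance, the choice $t = 10$, and the backtracking rule are illustrative assumptions; the point is that the line search can never accept a step crossing the boundary $x = 1$:

```python
import math

T = 10.0  # barrier accuracy parameter (illustrative choice)

def phi(x):
    """Centering objective t*x - log(x - 1) for: minimize x s.t. 1 - x <= 0."""
    return T * x - math.log(x - 1.0)

def grad(x):
    return T - 1.0 / (x - 1.0)

def hess(x):
    return 1.0 / (x - 1.0) ** 2   # strictly positive on the domain x > 1

x = 3.0  # strictly feasible start
for _ in range(20):
    step = -grad(x) / hess(x)     # Newton direction
    s = 1.0
    # Backtrack until the trial point is strictly feasible and phi decreases;
    # phi blows up near x = 1, so boundary-crossing steps are always rejected.
    while x + s * step <= 1.0 or phi(x + s * step) > phi(x):
        s *= 0.5
    x += s * step

print(x, 1.0 + 1.0 / T)  # x converges to x* = 1 + 1/t = 1.1, always staying > 1
```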